Newton’s Method for M-Tensor Equations
Authors
Abstract
We are concerned with tensor equations whose coefficient tensors are M-tensors. We first propose a Newton method for solving the equation with a positive constant term and establish its global and quadratic convergence. Then we extend the method to solve the equation with a nonnegative constant term. Finally, we carry out numerical experiments to test the proposed methods. The results show that the methods are quite efficient.
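The abstract describes the approach only in words, so the following is just a minimal sketch of a plain Newton iteration for a third-order tensor equation $\mathcal{A}x^{m-1}=b$ (here $m=3$), assuming the residual map $F(x)=\mathcal{A}x^{2}-b$ and a Jacobian assembled from mode products; the function name newton_mtensor and the way the test M-tensor is built are illustrative assumptions, not the exact scheme of the paper.

```python
import numpy as np

def newton_mtensor(A, b, x0, tol=1e-10, max_iter=50):
    """Plain Newton iteration for F(x) = A x^{m-1} - b = 0 with a
    3rd-order tensor A; an illustrative sketch, not the paper's method."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        # F(x)_i = sum_{j,k} A[i,j,k] x_j x_k - b_i
        F = np.einsum('ijk,j,k->i', A, x, x) - b
        if np.linalg.norm(F) <= tol:
            break
        # Jacobian: dF_i/dx_p = sum_k A[i,p,k] x_k + sum_j A[i,j,p] x_j
        J = np.einsum('ijk,k->ij', A, x) + np.einsum('ijk,j->ik', A, x)
        x = x - np.linalg.solve(J, F)
    return x

# Test problem: a strong M-tensor A = s*I - B with B >= 0 and s chosen
# larger than the sum of B's entries (a crude bound on its spectral radius).
rng = np.random.default_rng(0)
n = 5
B = rng.random((n, n, n))
s = B.sum() + 1.0
I = np.zeros((n, n, n))
for i in range(n):
    I[i, i, i] = 1.0
A = s * I - B

b = np.ones(n)                                   # positive constant term
x = newton_mtensor(A, b, x0=np.ones(n))
print(np.linalg.norm(np.einsum('ijk,j,k->i', A, x, x) - b))   # residual ~ 0
```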
Similar articles
ABS-Type Methods for Solving $m$ Linear Equations in $\frac{m}{k}$ Steps for $k=1,2,\cdots,m$
The ABS methods, introduced by Abaffy, Broyden and Spedicato, are direct iteration methods for solving a linear system where the $i$-th iteration satisfies the first $i$ equations, therefore a system of $m$ equations is solved in at most $m$ steps. In this paper, we introduce a class of ABS-type methods for solving full row rank linear equations, w...
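The snippet above is truncated, so as a concrete illustration of the ABS idea (the $i$-th iterate satisfies the first $i$ equations), here is a minimal sketch of one classical member of the class, the Huang parameter choice $z_i=w_i=a_i$, for a full-row-rank system; the name abs_huang_solve and the numerical tolerance are assumptions.

```python
import numpy as np

def abs_huang_solve(A, b, tol=1e-12):
    """ABS-type solver (Huang parameter choice z_i = w_i = a_i): after
    processing row i, the iterate satisfies the first i equations."""
    m, n = A.shape
    x = np.zeros(n)
    H = np.eye(n)                                  # H_1 = I
    for i in range(m):
        a = A[i]
        p = H.T @ a                                # search direction p_i = H_i^T a_i
        denom = a @ p                              # equals a_i^T H_i a_i
        if abs(denom) < tol:                       # row depends on earlier rows
            continue
        x = x - ((a @ x - b[i]) / denom) * p       # enforce equation i
        H = H - np.outer(H @ a, H.T @ a) / denom   # rank-one update of H
    return x

# usage: a small full-row-rank system, handled in m = 4 steps
rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))
b = rng.standard_normal(4)
x = abs_huang_solve(A, b)
print(np.linalg.norm(A @ x - b))                   # ~ 0
```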
Numerical Method for Three-parameter Eigenvalue Problems using Newton's Method based on Trace Theorem
Atkinson, F. V., 1972. 'Multiparameter Eigenvalue Problems' (Matrices and compact operators), Academic Press, New York, Vol. 1. Atkinson, F. V., 1968. 'Multiparameter spectral theory', Bull. Am. Math. Soc., Vol. 75, pp. 1-28. Baruah, A. K., 1987. 'Estimation of eigen elements in a two-parameter eigen value problem', Ph.D. Thesis, Dibrugarh University, Assam. Bindi...
Residual norm steepest descent based iterative algorithms for Sylvester tensor equations
Consider the following consistent Sylvester tensor equation \[\mathscr{X}\times_1 A + \mathscr{X}\times_2 B + \mathscr{X}\times_3 C = \mathscr{D},\] where the matrices $A, B, C$ and the tensor $\mathscr{D}$ are given and $\mathscr{X}$ is the unknown tensor. The current paper is concerned with examining a simple and neat framework for accelerating the speed of convergence of the gradient-based iterative algorithm and ...
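As a rough point of reference for the gradient-based iteration the snippet mentions, here is a plain Landweber-type sketch $\mathscr{X}\leftarrow\mathscr{X}+\mu L^{*}(\mathscr{D}-L(\mathscr{X}))$ for the operator $L(\mathscr{X})=\mathscr{X}\times_1 A+\mathscr{X}\times_2 B+\mathscr{X}\times_3 C$; the function names, the crude step-size bound, and the test problem are assumptions, and this is not the accelerated scheme of the paper.

```python
import numpy as np

def sylvester_op(X, A, B, C):
    """Apply L(X) = X x_1 A + X x_2 B + X x_3 C via mode-n products."""
    return (np.einsum('ip,pjk->ijk', A, X)
            + np.einsum('jp,ipk->ijk', B, X)
            + np.einsum('kp,ijp->ijk', C, X))

def gradient_iteration(A, B, C, D, iters=500):
    """Plain Landweber-type iteration X <- X + mu * L*(D - L(X));
    a baseline sketch, not the accelerated scheme of the paper."""
    X = np.zeros_like(D)
    # crude safe step size from the bound ||L|| <= ||A|| + ||B|| + ||C||
    mu = 1.0 / (np.linalg.norm(A, 2) + np.linalg.norm(B, 2) + np.linalg.norm(C, 2)) ** 2
    for _ in range(iters):
        R = D - sylvester_op(X, A, B, C)
        X = X + mu * sylvester_op(R, A.T, B.T, C.T)   # adjoint uses transposes
    return X

# usage on a small consistent problem with a known solution
rng = np.random.default_rng(1)
n = 4
A, B, C = (np.eye(n) + 0.2 * rng.random((n, n)) for _ in range(3))
X_true = rng.random((n, n, n))
D = sylvester_op(X_true, A, B, C)
X = gradient_iteration(A, B, C, D)
print(np.abs(X - X_true).max())                       # small
```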
Tensor-Krylov methods for large nonlinear equations
In this paper, we describe tensor methods for large systems of nonlinear equations based on Krylov subspace techniques for approximately solving the linear systems that are required in each tensor iteration. We refer to a method in this class as a tensor-Krylov algorithm. We describe comparative testing for a tensor-Krylov implementation versus an analogous implementation based on a Newton-Kryl...
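The excerpt compares a tensor-Krylov implementation against a Newton-Krylov one; as a generic point of reference only, here is a minimal matrix-free Newton-Krylov sketch that solves each linear system $J(x)s=-F(x)$ approximately with GMRES and finite-difference Jacobian-vector products, assuming SciPy is available; newton_krylov_sketch and the test system are illustrative, not the authors' code.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def newton_krylov_sketch(F, x0, tol=1e-6, max_iter=50, eps=1e-8):
    """Inexact Newton: each step solves J(x) s = -F(x) approximately with GMRES,
    using a finite-difference Jacobian-vector product (no explicit Jacobian)."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            break
        # J(x) v  ~  (F(x + eps*v) - F(x)) / eps
        J = LinearOperator((x.size, x.size),
                           matvec=lambda v: (F(x + eps * v) - Fx) / eps,
                           dtype=float)
        s, _ = gmres(J, -Fx)                 # approximate linear solve
        x = x + s
    return x

# usage on a small smooth system F(x) = 0
F = lambda x: x**3 + x - np.arange(1.0, 5.0)
x = newton_krylov_sketch(F, np.zeros(4))
print(np.linalg.norm(F(x)))                  # small
```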
An Improved Gauss-Newton Method based Back-propagation Algorithm for Fast Convergence
The present work deals with an improved back-propagation algorithm based on Gauss-Newton numerical optimization method for fast convergence. The steepest descent method is used for the back-propagation. The algorithm is tested using various datasets and compared with the steepest descent back-propagation algorithm. In the system, optimization is carried out using multilayer neural network. The ...
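The excerpt does not include the network or update formulas, so the following is only a generic Gauss-Newton sketch on a small nonlinear least-squares fit, with a tiny damping term as a safeguard; the names gauss_newton, residual, and jacobian, and the exponential test model, are illustrative assumptions rather than the algorithm of the cited work.

```python
import numpy as np

def gauss_newton(residual, jacobian, theta0, iters=20, damping=1e-8):
    """Gauss-Newton update theta <- theta - (J^T J + damping*I)^{-1} J^T r;
    the damping term is only a small numerical safeguard."""
    theta = np.asarray(theta0, dtype=float).copy()
    for _ in range(iters):
        r = residual(theta)
        J = jacobian(theta)
        step = np.linalg.solve(J.T @ J + damping * np.eye(theta.size), J.T @ r)
        theta = theta - step
    return theta

# usage: fit y = a * exp(b * t) to noisy data
rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 30)
y = 2.0 * np.exp(1.5 * t) + 0.01 * rng.standard_normal(t.size)
residual = lambda th: th[0] * np.exp(th[1] * t) - y
jacobian = lambda th: np.column_stack([np.exp(th[1] * t),
                                       th[0] * t * np.exp(th[1] * t)])
print(gauss_newton(residual, jacobian, np.array([1.0, 1.0])))  # ~ [2.0, 1.5]
```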
Journal
Journal title: Journal of Optimization Theory and Applications
Year: 2021
ISSN: 0022-3239, 1573-2878
DOI: https://doi.org/10.1007/s10957-021-01904-0